Definition of West Germany

  • (noun) a republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990

Synonyms of West Germany


  • Federal Republic of Germany
Antonyms of West Germany


No Antonyms Found.

Homophones of West Germany


No Homophones Found.